Encouraging Human Operators to Appropriately Rely on Automated Decision Aids

Authors

  • Mary T. Dzindolet
  • Hall P. Beck
  • Linda G. Pierce
Abstract

Information technology is changing the nature of the military decision-making process. However, the underlying assumption in employing human-automated system teams, namely that the team will be more productive than the human or the automated system would be alone, is not always met. Under certain conditions, human operators err by overly relying on or underutilizing automated systems [Parasuraman and Riley, 1997]. A Framework of Automation Use [Dzindolet et al., 1999] posits that cognitive, social, and motivational processes combine to predict automation use. Two studies were performed to examine social processes, controlling for cognitive and motivational processes. The framework posits that appropriate use of automation is most likely when human operators' estimates of the reliability of the automated system and of manual operation are accurate. Various ways of communicating information to human operators about the reliability of their own and an automated aid's decisions were examined in an effort to encourage appropriate reliance on automated decision aids. Both studies found alarming rates of disuse. However, providing many sources of information about the reliability of the automated decision aid succeeded in reducing the bias toward disuse. The results have implications for both training and system design.

1. Automated Decision Aids

Dramatic increases in the use of automation have occurred in the military in recent years [cf. Cesar, 1995]. Information technology is changing the nature of the military decision-making process by providing decision makers with more relevant, accurate, and timely information than was previously possible. The underlying assumption in employing these human-automated system "teams" is that the teams will be more productive than either the automated system or the human would be working alone.
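The framework's notion of "appropriate" reliance can be made concrete with a minimal sketch. This is an illustration only, not the paper's model: the function names and the simple reliability-comparison rule are assumptions. The idea is that an operator relies on the aid when it is perceived as more reliable than manual operation, and the reliance is appropriate when that choice matches what the true reliabilities would dictate.

```python
# Minimal sketch of "appropriate reliance" as described by the framework.
# All names and the comparison rule are illustrative assumptions.

def chooses_aid(perceived_aid: float, perceived_manual: float) -> bool:
    """Operator relies on the aid when it is perceived as more reliable."""
    return perceived_aid > perceived_manual

def reliance_is_appropriate(perceived_aid: float, perceived_manual: float,
                            true_aid: float, true_manual: float) -> bool:
    """Reliance is appropriate when the operator's choice matches the choice
    that the true reliabilities would dictate."""
    return chooses_aid(perceived_aid, perceived_manual) == (true_aid > true_manual)

# Disuse: the aid is actually better, but the operator underestimates it.
print(reliance_is_appropriate(0.60, 0.75, true_aid=0.90, true_manual=0.75))  # False
# Accurate reliability estimates lead to appropriate reliance.
print(reliance_is_appropriate(0.90, 0.75, true_aid=0.90, true_manual=0.75))  # True
```

Under this toy rule, both misuse (perceived aid reliability too high) and disuse (too low) fall out of the same mismatch between perceived and true reliability.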
While some researchers have found support for this underlying assumption [Dalal and Kasper, 1994], others have found that human operators often overly rely on (misuse) or underutilize (disuse) automated decision systems [Parasuraman and Riley, 1997]. Some speculate that this inappropriate use is the main reason why automation of decision making has not advanced as far as automation in other areas [Cohen et al., 1999]. By understanding the processes human operators use when allocating tasks to automated systems, one may be able to better design systems that encourage appropriate use of automated aids. What are the processes leading to the decision to rely on or ignore an automated system? Drawing from Lee and Moray [1992; 1994], Mosier and Skitka [1996], and Shepperd [1993], Dzindolet, Pierce, Beck, and Dawe [1999] created a framework that predicts automation use from cognitive, motivational, and social processes (see Figure 1). According to the model, trust in the automated system affects automation use.

Figure 1. Framework of Automation Reliance [Dzindolet et al., 1999]

(Report Documentation Page, Standard Form 298, Rev. 8-98: report date 2000; performing organization: Cameron University, Department of Psychology, 2800 Gore Blvd, Lawton, OK 73505; approved for public release, distribution unlimited; 10 pages.)

1.1 Cognitive Processes

One reason people may overly rely on automated systems when making decisions is the manner in which they process the information the aid provides. Rather than going through the cognitive effort of gathering and processing information themselves, operators simply adopt the decision supplied by the automated system. Often this strategy is optimal; under certain conditions, however, such reliance is inappropriate and misuse occurs. Relying on a decision aid in this heuristic manner has been dubbed the automation bias [Mosier and Skitka, 1996]. Although the automation bias can explain misuse of automated decision aids, it cannot account for the disuse that is often found. To eliminate the automation bias, participants in the two studies reported in this paper were shown their automated aid's decision only after they had indicated their own decision and their confidence in it.
This procedure prevented participants from relying on the automated aid's decision in a heuristic manner; they did not even know the aid's decision until after they had made their own.

1.2 Motivational Processes

When working in a group, responsibility for the group's product is diffused among the group members. Several researchers have treated the human-computer system as a dyad or team in which one member is not human [e.g., Bowers et al., 1996]. Thus, the human may feel less responsible for the outcome when working with an automated system than when working alone, and may expend less effort. In the social psychological literature, this phenomenon has been dubbed social loafing or free riding [Kerr and Bruun, 1983]. One theory that has been successful in accounting for many of the findings in the social loafing literature is Shepperd's Expectancy-Value Theory [1993]. According to this theory, motivation is predicted as a function of three factors: expectancy, instrumentality, and outcome value.
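The three-factor structure can be sketched numerically. Shepperd [1993] specifies only that motivation is a function of expectancy, instrumentality, and outcome value; the multiplicative combination below is an assumption borrowed from classic expectancy theory, not a claim about Shepperd's exact formulation, and the numbers are invented for illustration.

```python
# Illustrative expectancy-value motivation score. The multiplicative form
# is an assumed simplification; Shepperd [1993] states only that motivation
# is a function of these three factors.

def motivation(expectancy: float, instrumentality: float, outcome_value: float) -> float:
    """expectancy: belief that one's effort leads to performance (0..1)
    instrumentality: belief that one's performance matters for the outcome (0..1)
    outcome_value: how much the outcome is worth to the person (>= 0)"""
    return expectancy * instrumentality * outcome_value

# Working alongside an automated aid can lower instrumentality (the
# operator's own effort seems less necessary for the team's outcome),
# which lowers motivation; this is one account of social loafing.
alone = motivation(0.9, 0.9, 1.0)     # high: effort clearly matters
with_aid = motivation(0.9, 0.4, 1.0)  # lower: the aid dilutes responsibility
print(alone > with_aid)  # True
```

On this reading, a human-automation "team" invites loafing through the instrumentality term: the outcome no longer hinges on the human's effort alone.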


Similar resources

Effects of Information Source, Pedigree, and Reliability on Operator Interaction With Decision Support Systems

OBJECTIVE Two experiments are described that examined operators' perceptions of decision aids. BACKGROUND Research has suggested certain biases against automation that influence human interaction with automation. We differentiated preconceived biases from post hoc biases and examined their effects on advice acceptance. METHOD In Study 1 we examined operators' trust in and perceived reliabil...

Full text

Effects of Information Source, Pedigree, and Reliability on Operators’ Utilization of Diagnostic Advice

Studies have demonstrated that humans appear to apply norms of human-human interaction to their interaction with machines. Yet, there exist subtle differences in people’s perceptions of automated aids compared to humans. We examined factors differentiating human-human and human-automation interaction, wherein participants (n = 180) performed a luggage-screening task with the assistance of human ...

Full text

Trust, situation awareness and automation use: exploring the effect of visual information degradations on human perception and performance in human-telerobot

Today’s military and industry increasingly use human-robot systems to perform complex tasks, such as firefighting. Automated systems that support or even make important decisions require human operators to understand and trust automation in order to rely on it appropriately. This study used a real human-telerobot system performing a firefighting task in an unknown welding room to examine the ef...

Full text

A WAVELET-BASED PROCEDURE FOR MINING OF PULSE-LIKE GROUND MOTIONS FEATURES ON RESPONSE SPECTRA

The main objective of this paper is to present a wavelet-based procedure to characterize principal features of a special class of motions called pulse-like ground motions. Initially, the continuous wavelet transform (CWT), known as a powerful technique in both earthquake engineering and seismology, is applied to automated detection of strong earthquake pulses. In this pr...

Full text

Supporting Trust Calibration and the Effective Use of Decision Aids by Presenting Dynamic System Confidence Information

OBJECTIVE To examine whether continually updated information about a system's confidence in its ability to perform assigned tasks improves operators' trust calibration in, and use of, an automated decision support system (DSS). BACKGROUND The introduction of decision aids often leads to performance breakdowns that are related to automation bias and trust miscalibration. This can be explained,...

Full text

